# 8192 Context Length
## llm-jp-modernbert-base
- License: Apache-2.0
- A Japanese large language model based on the ModernBERT-base architecture, supporting a maximum sequence length of 8,192 tokens and trained on a 3.4 TB Japanese corpus (a loading sketch follows below).
- Tags: Large Language Model, Transformers, Japanese
- Publisher: llm-jp · Downloads: 1,398 · Likes: 5
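
The model can be exercised with the standard Transformers masked-language-modeling classes. This is a minimal sketch, not an official recipe: the repository ID `llm-jp/llm-jp-modernbert-base` is assumed from the publisher and model name above, so verify it on the hub before use.

```python
# Minimal masked-LM sketch; the repository ID is assumed from the listing above.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "llm-jp/llm-jp-modernbert-base"  # assumed repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# "The capital of Japan is [MASK]." built with the tokenizer's own mask token.
text = f"日本の首都は{tokenizer.mask_token}です。"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Decode the highest-scoring prediction at the masked position.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```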
## ModernBERT-base-squad2-v0.2
- License: Apache-2.0
- A question-answering model fine-tuned from ModernBERT-base-nli, with support for long-context processing (a usage sketch follows below).
- Tags: Question Answering System, Transformers
- Publisher: Praise2112 · Downloads: 42 · Likes: 2
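
Extractive question answering like this is typically run through the Transformers pipeline API. A minimal sketch follows; the repository ID `Praise2112/modernbert-base-squad2-v0.2` and the example context are assumptions based on the listing, not confirmed details.

```python
# Question-answering sketch; the repository ID is assumed from the listing above.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="Praise2112/modernbert-base-squad2-v0.2",  # assumed repository ID
)

result = qa(
    question="What is the maximum sequence length?",
    context=(
        "ModernBERT extends the usable context window to 8,192 tokens, "
        "which makes it practical for long-document question answering."
    ),
)
print(result["answer"], result["score"])
```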
## gte-en-mlm-base
- License: Apache-2.0
- An English text encoder from the GTE-v1.5 series, built on an improved BERT architecture and supporting context lengths up to 8,192 tokens; suited to English text-representation tasks (an embedding sketch follows below).
- Tags: Text Embedding, Supports Multiple Languages
- Publisher: Alibaba-NLP · Downloads: 231 · Likes: 7
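
A sentence-representation sketch under stated assumptions: the repository ID `Alibaba-NLP/gte-en-mlm-base` is inferred from the listing, the GTE-v1.5 encoders typically ship custom modeling code and so may require `trust_remote_code=True`, and CLS pooling is only one illustrative pooling choice.

```python
# Text-representation sketch; repo ID and pooling choice are assumptions.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

model_id = "Alibaba-NLP/gte-en-mlm-base"  # assumed repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)

sentences = [
    "Long-context retrieval benefits from 8k-token encoders.",
    "Short queries still work with the same model.",
]
batch = tokenizer(
    sentences, padding=True, truncation=True, max_length=8192, return_tensors="pt"
)

with torch.no_grad():
    hidden = model(**batch).last_hidden_state

# CLS-token pooling followed by cosine similarity between the two sentences.
embeddings = F.normalize(hidden[:, 0], dim=-1)
print((embeddings[0] @ embeddings[1]).item())
```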